28 research outputs found

    Objectively Optimized Earth Observing Systems

    Artificial Intelligence in Geoscience and Remote Sensing

    Artificial Intelligence in Aerospace

    AutoChem

    AutoChem is a suite of Fortran 90 computer programs for the modeling of kinetic reaction systems. AutoChem performs automatic code generation, symbolic differentiation, analysis, and documentation, producing a documented, stand-alone system for the modeling and assimilation of atmospheric chemistry. Given databases of chemical reactions and a user-defined list of constituents, AutoChem automatically does the following:
    1. Selects the subset of reactions that involve the user-defined constituents and prepares a document listing those reactions;
    2. Constructs the ordinary differential equations (ODEs) that describe the reactions as functions of time and prepares a document containing the ODEs;
    3. Symbolically differentiates the time derivatives to obtain the Jacobian and prepares a document containing the Jacobian;
    4. Symbolically differentiates the Jacobian to obtain the Hessian and prepares a document containing the Hessian; and
    5. Writes all the required Fortran 90 code and data files for a stand-alone chemical modeling and assimilation system (the implementation of steps 1 through 5).
    Typically, steps 1 through 5 take about 3 seconds. The modeling system includes diagnostic components that automatically analyze, at run time, each ODE, the relative importance of each of its terms, the associated time scales, and other attributes of the model.
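    The core of steps 2 through 4 is symbolic differentiation of the kinetic ODEs. As a minimal sketch of that idea (using Python/SymPy rather than AutoChem's Fortran 90 generator, and a made-up two-reaction system), one can build the time derivatives from mass-action kinetics and differentiate them symbolically to obtain the Jacobian and Hessian:

```python
import sympy as sp

# Hypothetical two-reaction system (not from the AutoChem databases):
#   A + B -> C  with rate constant k1
#   C -> A + B  with rate constant k2
A, B, C, k1, k2 = sp.symbols('A B C k1 k2', positive=True)
species = [A, B, C]

# Step 2: time derivatives (right-hand sides of the ODEs) from mass-action kinetics
rhs = sp.Matrix([
    -k1*A*B + k2*C,   # dA/dt
    -k1*A*B + k2*C,   # dB/dt
     k1*A*B - k2*C,   # dC/dt
])

# Step 3: Jacobian, the first symbolic derivatives of the RHS w.r.t. each species
jacobian = rhs.jacobian(species)

# Step 4: Hessians, the second symbolic derivatives (one matrix per ODE)
hessians = [sp.hessian(expr, species) for expr in rhs]

sp.pprint(jacobian)
for h in hessians:
    sp.pprint(h)
```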

    UV/Vis+ Photochemistry Database: Structure, Content and Applications

    Acknowledgments: This research did not receive any specific grant from funding agencies in the public, commercial, or not-for-profit sectors. However, the authors are indebted to the colleagues who support us in maintaining the database through the provision of spectral and other photochemical data and information. The National Center for Atmospheric Research is operated by the University Corporation for Atmospheric Research, under the sponsorship of the National Science Foundation. Disclaimer: The views expressed in this paper are those of the authors and do not necessarily represent the views or policies of the U.S. EPA. Mention of trade names or products does not convey, and should not be interpreted as conveying, official U.S. EPA approval, endorsement, or recommendation.

    Representativeness Uncertainty in Chemical Data Assimilation Highlights Mixing Barriers

    When performing chemical data assimilation, the observational, representativeness, and theoretical uncertainties have very different characteristics. In this study we have accurately characterized the representativeness uncertainty by studying the probability distribution function (PDF) of the observations. The average deviation has been used as a measure of the width of the PDF, and hence of the variability (representativeness uncertainty), for the grid cell. It turns out that for long-lived tracers such as N2O and CH4, the representativeness uncertainty is markedly different from the observational uncertainty and clearly delineates mixing barriers such as the polar vortex edge, the tropical pipe, and the tropopause.
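    As an illustration of the approach (a minimal sketch with an assumed grid-cell binning, not the study's own code), the representativeness uncertainty can be estimated by pooling all observations that fall within a grid cell and taking the average deviation as the width of their PDF:

```python
import numpy as np

def representativeness_uncertainty(obs_values, cell_index, n_cells):
    """Average (mean absolute) deviation of the observations within each grid cell.

    obs_values: 1-D array of tracer observations.
    cell_index: grid-cell id assigned to each observation (same length).
    """
    sigma_rep = np.full(n_cells, np.nan)
    for cell in range(n_cells):
        in_cell = obs_values[cell_index == cell]
        if in_cell.size > 1:
            # width of the in-cell PDF, measured by the average deviation
            sigma_rep[cell] = np.mean(np.abs(in_cell - in_cell.mean()))
    return sigma_rep
```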

    Atmospheric Pseudohalogen Chemistry

    Hydrogen cyanide is not usually considered in atmospheric chemical models. This paper presents three reasons why hydrogen cyanide is likely to be significant for atmospheric chemistry. Firstly, HCN is a product and marker of biomass burning. Secondly, lightning is also likely to produce HCN, and as HCN is sparingly soluble, it could be a useful long-lived "smoking gun" marker of lightning activity. Thirdly, the chemical decomposition of HCN leads to the production of small amounts of the cyanide (CN) and NCO radicals. The NCO radical can be photolyzed in the visible portion of the spectrum, yielding nitrogen atoms (N). The production of nitrogen atoms is significant because it leads to the titration of total nitrogen from the atmosphere via N + N -> N2, where N2 is molecular nitrogen.

    Autonomous Learning of New Environments with a Robotic Team Employing Hyper-Spectral Remote Sensing, Comprehensive In-Situ Sensing and Machine Learning

    This paper describes and demonstrates an autonomous robotic team that can rapidly learn the characteristics of environments it has never seen before. The flexible paradigm is easily scalable to multi-robot, multi-sensor autonomous teams, and it is relevant to satellite calibration/validation and the creation of new remote sensing data products. A case study is described for the rapid characterisation of the aquatic environment: over a period of just a few minutes, thousands of training data points were acquired. These training data allowed our machine learning algorithms to learn rapidly by example and to provide wide-area maps of the composition of the environment. Alongside these larger autonomous robots, two smaller robots that can be deployed by a single individual (a walking robot and a robotic hoverboard) were also deployed, observing significant small-scale spatial variability.
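    As a rough illustration of the learning-by-example step (a sketch with assumed file names and data layout, not the team's actual pipeline), a regressor can be trained on co-located hyperspectral spectra and in-situ measurements and then applied to every pixel of a scene to produce a wide-area composition map:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

# Hypothetical arrays: spectra of the pixels visited by the in-situ robot
# (n_samples x n_bands) and the in-situ quantity measured at those pixels.
spectra = np.load('colocated_spectra.npy')        # assumed file
in_situ = np.load('colocated_measurements.npy')   # assumed file

X_train, X_test, y_train, y_test = train_test_split(spectra, in_situ, test_size=0.2)
model = RandomForestRegressor(n_estimators=200)
model.fit(X_train, y_train)
print('hold-out R^2:', model.score(X_test, y_test))

# Apply to the full scene (n_rows x n_cols x n_bands) to make a wide-area map
scene = np.load('hyperspectral_scene.npy')        # assumed file
rows, cols, bands = scene.shape
composition_map = model.predict(scene.reshape(-1, bands)).reshape(rows, cols)
```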

    Decoding Physical and Cognitive Impacts of Particulate Matter Concentrations at Ultra-Fine Scales

    The human body is an incredible and complex sensing system. Environmental factors trigger a wide range of automatic neurophysiological responses. Biometric sensors can capture these responses in real time, providing clues about the underlying biophysical mechanisms. In this prototype study, we demonstrate an experimental paradigm to holistically capture and evaluate the interactions between an environmental context and the physiological markers of an individual operating in that environment. A cyclist equipped with a biometric sensing suite is followed by an environmental survey vehicle during outdoor bike rides. The interactions between environment and physiology are then evaluated through the development of empirical machine learning models, which estimate particulate matter concentrations from biometric variables alone. Here, we show that biometric variables can be used to estimate particulate matter concentrations at ultra-fine spatial scales with high fidelity (r2 = 0.91) and that smaller particles are better estimated than larger ones. Inferring environmental conditions solely from biometric measurements allows us to disentangle key interactions between the environment and the body. This work sets the stage for future investigations of these interactions for a larger number of factors, e.g., black carbon, CO2, NO/NO2/NOx, and ozone. By tapping into our body’s ‘built-in’ sensing abilities, we can gain insights into how our environment influences our physical health and cognitive performance.
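    A minimal sketch of the estimation step (with hypothetical feature and target files, not the study's code) trains a regressor to map biometric variables to particulate matter concentration and reports the cross-validated r2, the skill metric quoted above:

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import cross_val_score

# Assumed inputs: one row per time step of biometric features (heart rate,
# respiration, skin temperature, ...) and the co-located PM concentration.
biometrics = np.load('biometric_features.npy')   # assumed file, shape (n, n_features)
pm = np.load('pm_concentration.npy')             # assumed file, shape (n,)

model = GradientBoostingRegressor()
r2_scores = cross_val_score(model, biometrics, pm, cv=5, scoring='r2')
print('cross-validated r^2: %.2f +/- %.2f' % (r2_scores.mean(), r2_scores.std()))
```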

    Holistics 3.0 for Health

    Human health is part of an interdependent, multifaceted system. More than ever, we have increasingly large amounts of data, both spatial and non-spatial, on the body, its systems, disease, and our social and physical environment, and these data have a geospatial component. An exciting new era is dawning in which we simultaneously collect multiple datasets describing many aspects of health, wellness, human activity, environment, and disease. Valuable insights can be extracted from these datasets using massively multivariate computational techniques, such as machine learning, coupled with geospatial techniques. These computational tools help us understand the topology of the data and provide insights for scientific discovery, decision support, and policy formulation. This paper outlines a holistic paradigm, called Holistics 3.0, for analyzing health data, together with a set of examples. Holistics 3.0 combines multiple big datasets, set in their geospatial context and describing as many facets of a problem as possible, with machine learning and causality analysis, both to learn from the data and to construct tools for data-driven decisions.